Abstract. Attention plays a key role in the improvement of sequence-to-sequence-based document summarization models. To obtain a powerful attention helping ...
Oct 25, 2019 · We propose an attention refinement unit paired with local variance loss to impose supervision on the attention model at each decoding step, and ...
This work augments the vanilla attention model from both local and global aspects with an attention refinement unit paired with local variance loss to ...
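The snippets above describe a local variance loss that supervises the attention distribution at each decoding step. The exact formulation is in the paper; as a hedged sketch (function name, top-k choice, and sign convention are assumptions), one way to reward a peaked attention distribution is to penalize low variance among its largest weights:

```python
from statistics import pvariance

def local_variance_loss(attn, k=3):
    """Illustrative sketch only, not the paper's exact loss.
    attn: attention weights for one decoding step (non-negative, sum to 1).
    Returns a loss that is lower when attention is concentrated
    (high variance among the top-k weights) and highest when uniform."""
    topk = sorted(attn)[-k:]      # the k largest attention weights
    return -pvariance(topk)      # peaked attention -> larger variance -> lower loss
```

For example, a peaked distribution such as `[0.9, 0.05, 0.05, 0.0]` yields a lower (more negative) loss than the uniform `[0.25, 0.25, 0.25, 0.25]`, whose top-k variance is zero.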
Apr 21, 2024 · The experiments show that our proposed model significantly improves the abstractive summarization performance compared to the state-of-the-art ...
Mar 13, 2023 · Abstract. Summarization generates a brief and concise summary which portrays the main idea of the source text. There are two forms of ...
EMNLP-IJCNLP 2019 Presentation, Session 3C: Discourse, Summarization, & Generation. Title: Attention Optimization for Abstractive Document Summarization ...
Jun 21, 2021 · Based on it, we develop a HITS-based attention mechanism in this paper, which fully leverages sentence-level and word-level information by ...
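The last snippet mentions a HITS-based attention mechanism that leverages sentence-level and word-level information. As an illustration of the underlying idea only (the paper's actual mechanism may differ), the classic HITS power iteration can score sentences as hubs and words as authorities over a sentence-word incidence matrix:

```python
def hits(adj, iters=50):
    """Classic HITS power iteration (illustrative sketch).
    adj[i][j] = 1 if sentence i contains word j, else 0.
    Returns (hub, authority): hub scores for sentences,
    authority scores for words, each L2-normalized."""
    n, m = len(adj), len(adj[0])
    hub = [1.0] * n
    auth = [1.0] * m
    for _ in range(iters):
        # authority of a word = sum of hub scores of sentences containing it
        auth = [sum(adj[i][j] * hub[i] for i in range(n)) for j in range(m)]
        norm = sum(a * a for a in auth) ** 0.5
        auth = [a / norm for a in auth]
        # hub score of a sentence = sum of authority scores of its words
        hub = [sum(adj[i][j] * auth[j] for j in range(m)) for i in range(n)]
        norm = sum(h * h for h in hub) ** 0.5
        hub = [h / norm for h in hub]
    return hub, auth
```

On a toy graph where sentence 0 contains words {0, 1, 2} and sentence 1 contains only word 0, sentence 0 ends up with the higher hub score and word 0 (shared by both sentences) with the highest authority, which is the kind of mutual sentence/word reinforcement the snippet alludes to.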